Recurrent Neural Network Embedding for Knowledge-base Completion

Author

  • Yuxing Zhang
Abstract

Knowledge can often be represented using entities connected by relations. For example, the fact that a tennis ball is round can be represented as “TennisBall HasShape Round”, where “TennisBall” is one entity, “HasShape” is a relation, and “Round” is another entity. A knowledge base stores such structured information as triples of the entity-relation-entity form, and a real-world knowledge base often contains millions or billions of such triples. Several well-known knowledge bases exist, including FreeBase [1], WordNet [2], and YAGO [3]. They are important in fields such as reasoning and question answering; for instance, if one asks “what is the shape of a tennis ball”, we can search the knowledge base for the triple “TennisBall HasShape Round” and output “round” as the answer.
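To make the lookup concrete, the toy Python sketch below stores a handful of triples and answers the tennis-ball question by matching on the head entity and relation. The extra triples and the query helper are illustrative only, not part of the paper.

# A toy knowledge base stored as (head entity, relation, tail entity) triples.
knowledge_base = {
    ("TennisBall", "HasShape", "Round"),
    ("TennisBall", "HasColor", "Yellow"),
    ("Paris", "CapitalOf", "France"),
}

def query(head, relation):
    """Return all tail entities t such that (head, relation, t) is in the KB."""
    return [t for (h, r, t) in knowledge_base if h == head and r == relation]

# "What is the shape of a tennis ball?" -> look up (TennisBall, HasShape, ?)
print(query("TennisBall", "HasShape"))  # ['Round']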


Similar Articles

Neuron Mathematical Model Representation of Neural Tensor Network for RDF Knowledge Base Completion

In this paper, a state-of-the-art neuron mathematical model of the neural tensor network (NTN) is proposed for the RDF knowledge base completion problem. One of the difficulties with the parameters of the network is that a representation of its neuron mathematical model is not possible. For this reason, a new representation of this network is suggested that solves this difficulty. In the representation, th...
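For reference, the score that a standard neural tensor network assigns to a triple (e1, R, e2) can be sketched with NumPy as below. The slice count k, the dimensions, and the parameter names follow the common NTN formulation and are assumptions here, not details taken from this paper.

import numpy as np

def ntn_score(e1, e2, W, V, b, u):
    """Standard NTN score for a triple (e1, R, e2).

    e1, e2 : entity embeddings, shape (d,)
    W      : relation-specific tensor, shape (k, d, d)
    V      : relation-specific matrix, shape (k, 2*d)
    b      : bias, shape (k,)
    u      : output weights, shape (k,)
    """
    bilinear = np.array([e1 @ W[i] @ e2 for i in range(W.shape[0])])  # (k,)
    linear = V @ np.concatenate([e1, e2])                             # (k,)
    return u @ np.tanh(bilinear + linear + b)

# Toy usage with random parameters (embedding size d = 4, k = 2 tensor slices).
d, k = 4, 2
rng = np.random.default_rng(0)
score = ntn_score(rng.normal(size=d), rng.normal(size=d),
                  rng.normal(size=(k, d, d)), rng.normal(size=(k, 2 * d)),
                  rng.normal(size=k), rng.normal(size=k))
print(score)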


Knowledge-Based Semantic Embedding for Machine Translation

In this paper, with the help of a knowledge base, we build and formulate a semantic space to connect the source and target languages, and apply it to the sequence-to-sequence framework to propose a Knowledge-Based Semantic Embedding (KBSE) method. In our KBSE method, the source sentence is first mapped into a knowledge-based semantic space, and the target sentence is generated using a recurrent...
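The two-stage pipeline described above (encode the source sentence into a semantic vector, then generate the target sentence with a recurrent decoder) can be sketched schematically as follows. The module names, dimensions, and the plain GRU encoder standing in for the knowledge-based mapping are illustrative assumptions only, not the paper's actual model.

import torch
import torch.nn as nn

class KnowledgeGroundedSeq2Seq(nn.Module):
    def __init__(self, src_vocab, tgt_vocab, emb_dim=64, sem_dim=128):
        super().__init__()
        self.src_emb = nn.Embedding(src_vocab, emb_dim)
        self.encoder = nn.GRU(emb_dim, sem_dim, batch_first=True)
        self.tgt_emb = nn.Embedding(tgt_vocab, emb_dim)
        self.decoder = nn.GRU(emb_dim, sem_dim, batch_first=True)
        self.out = nn.Linear(sem_dim, tgt_vocab)

    def forward(self, src_ids, tgt_ids):
        # (1) map the source sentence to a single semantic-space vector
        _, semantic = self.encoder(self.src_emb(src_ids))
        # (2) generate the target sentence from that vector with a recurrent decoder
        dec_out, _ = self.decoder(self.tgt_emb(tgt_ids), semantic)
        return self.out(dec_out)  # target-word logits

model = KnowledgeGroundedSeq2Seq(src_vocab=1000, tgt_vocab=1000)
logits = model(torch.randint(0, 1000, (2, 7)), torch.randint(0, 1000, (2, 5)))
print(logits.shape)  # torch.Size([2, 5, 1000])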


A Novel Embedding Model for Knowledge Base Completion Based on Convolutional Neural Network

We introduce a novel embedding method for the knowledge base completion task. Our approach advances the state of the art (SOTA) by employing a convolutional neural network (CNN), which can capture global relationships and transitional characteristics. We represent each triple (head entity, relation, tail entity) as a 3-column matrix, which is the input to the convolution layer. Different fi...
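A minimal sketch of this kind of CNN-based triple scoring is given below: the head, relation, and tail embeddings are stacked into a d x 3 matrix and convolved with 1 x 3 filters, and the feature maps are mapped to a plausibility score. The dimensions and layer names are illustrative assumptions, not the paper's exact configuration.

import torch
import torch.nn as nn

class ConvTripleScorer(nn.Module):
    def __init__(self, emb_dim=50, num_filters=8):
        super().__init__()
        self.conv = nn.Conv2d(1, num_filters, kernel_size=(1, 3))  # each filter spans all 3 columns
        self.fc = nn.Linear(num_filters * emb_dim, 1)

    def forward(self, head, rel, tail):
        # head, rel, tail: (batch, emb_dim) -> 3-column matrix of shape (batch, 1, emb_dim, 3)
        x = torch.stack([head, rel, tail], dim=-1).unsqueeze(1)
        features = torch.relu(self.conv(x))             # (batch, filters, emb_dim, 1)
        return self.fc(features.flatten(start_dim=1))   # (batch, 1) plausibility score

scorer = ConvTripleScorer()
h, r, t = (torch.randn(4, 50) for _ in range(3))
print(scorer(h, r, t).shape)  # torch.Size([4, 1])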


Compositional Vector Space Models for Knowledge Base Completion

Knowledge base (KB) completion adds new facts to a KB by making inferences from existing facts, for example by inferring with high likelihood nationality(X,Y) from bornIn(X,Y). Most previous methods infer simple one-hop relational synonyms like this, or use as evidence a multi-hop relational path treated as an atomic feature, like bornIn(X,Z) → containedIn(Z,Y). This paper presents an approach t...
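The path-composition idea can be sketched as follows: rather than treating a multi-hop path as one atomic feature, the relation embeddings along the path are fed through a recurrent layer, and the resulting path vector is compared with the embedding of the target relation (e.g. nationality). All names and dimensions below are illustrative assumptions.

import torch
import torch.nn as nn

class PathComposer(nn.Module):
    def __init__(self, num_relations, dim=32):
        super().__init__()
        self.rel_emb = nn.Embedding(num_relations, dim)
        self.rnn = nn.RNN(dim, dim, batch_first=True)

    def score(self, path_relation_ids, target_relation_id):
        path = self.rel_emb(path_relation_ids)      # (batch, path_len, dim)
        _, h = self.rnn(path)                       # final hidden state: (1, batch, dim)
        target = self.rel_emb(target_relation_id)   # (batch, dim)
        return (h.squeeze(0) * target).sum(-1)      # dot-product similarity with the target relation

composer = PathComposer(num_relations=10)
path = torch.tensor([[2, 5]])    # e.g. the path bornIn -> containedIn
target = torch.tensor([7])       # e.g. the relation nationality
print(composer.score(path, target))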


Word Sense Disambiguation with Recurrent Neural Networks

This paper presents a neural network architecture for word sense disambiguation (WSD). The architecture employs recurrent neural layers, more specifically LSTM cells, in order to capture information about word order and to easily incorporate distributed word representations (embeddings) as features, without having to use a fixed window of text. The paper demonstrates that the architecture is...
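A minimal sketch of this kind of architecture is shown below: an LSTM runs over the embeddings of the whole sentence (no fixed context window), and the sense of the target word is classified from the recurrent state at its position. Vocabulary size, dimensions, and layer names are illustrative assumptions.

import torch
import torch.nn as nn

class LSTMSenseTagger(nn.Module):
    def __init__(self, vocab_size=5000, emb_dim=100, hidden=128, num_senses=4):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb_dim)
        self.lstm = nn.LSTM(emb_dim, hidden, batch_first=True, bidirectional=True)
        self.classifier = nn.Linear(2 * hidden, num_senses)

    def forward(self, token_ids, target_position):
        states, _ = self.lstm(self.emb(token_ids))   # (batch, seq_len, 2*hidden)
        # take the LSTM state at the position of the ambiguous word
        target_state = states[torch.arange(states.size(0)), target_position]
        return self.classifier(target_state)         # sense logits

tagger = LSTMSenseTagger()
sentence = torch.randint(0, 5000, (1, 9))   # one 9-token sentence
print(tagger(sentence, torch.tensor([3])))  # sense scores for the 4th token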




Publication date: 2016